
Energy Demand


London Eye architect proposes 14-mile tidal power station off Somerset coast

The Guardian > Energy

West Somerset Lagoon would harness renewable energy for UK's AI boom - and create an 'iconic' arc around the Bristol Channel. The architect of the London Eye wants to build a vast tidal power station in a 14-mile arc off the coast of Somerset that could help Britain meet surging electricity demand to power artificial intelligence - and create a new race track to let cyclists skim over the Bristol Channel. Julia Barfield, who designed the Eye and the i360 observation tower in Brighton, is part of a team that has drawn up the £11bn proposal. The proposal comes amid growing concern that rapidly rising use of AI in Britain will drive up carbon emissions unless more renewable energy sources are found. The AI boom is expected to add to sharp increases in demand for electricity across the UK, which the government estimated this month could more than double by 2050. "If the decision is to go ahead with adopting more and more AI - which I am surprised is not being questioned more at a time of climate emergency - then it is going to be better with a renewable energy source," said Barfield.


China Dives in on the World's First Wind-Powered Undersea Data Center

WIRED

The $226 million project uses ocean breezes and seawater to stay cool. China has completed the first phase of construction of what it claims is the world's first underwater data center (UDC). Located in Shanghai's Lin-gang Special Area with a price tag of roughly RMB 1.6 billion ($226 million), it marks a significant milestone in the quest for sustainable solutions to the growing energy demands of China's computing infrastructure. Powered entirely by wind energy, the initiative has a total power capacity of 24 megawatts.


Reward scheme for using less power at peak times could help lower US bills

The Guardian > Energy

With AI datacenters driving up power bills for households, a policy called 'demand flexibility' could help ease grid strain. A cheap, bipartisan tool could help the US meet increasing energy demand from AI datacenters while also easing soaring power bills for households, preventing deadly blackouts and helping the climate. The policy solution, called "demand flexibility", can be quickly deployed across the US. Demand flexibility essentially means rewarding customers for using less power during times of high demand, reducing strain on the grid, or, in some cases, selling energy captured by solar panels on their homes. Peak power demand is expected to grow by 20% over the next decade - driven by the dramatic rise of AI datacenters, the onshoring of manufacturing, increasing use of EVs, and a growing need for air conditioning amid hotter summers. Increasing energy demand is putting states such as California and Texas at higher risk of life-threatening blackouts in extreme weather.


3 takeaways about climate tech right now

MIT Technology Review

What our latest list of Climate Tech Companies to Watch says about this moment. On Monday, we published our 2025 edition of Climate Tech Companies to Watch. This marks the third time we've put the list together, and it's become one of my favorite projects to work on every year. In the journalism world, it's easy to get caught up in the latest news, whether it's a fundraising round, research paper, or startup failure. Curating this list gives our team a chance to take a step back and consider the broader picture. What industries are making progress or lagging behind?


The Download: computing's bright young minds, and cleaning up satellite streaks

MIT Technology Review

Each year, MIT Technology Review honors 35 outstanding people under the age of 35 who are driving scientific progress and solving tough problems in their fields. Today we want to introduce you to the computing innovators on the list who are coming up with new AI chips and specialized datasets - along with smart ideas about how to assess advanced systems for safety. Earlier this year, the $800 million Vera Rubin Observatory commenced its decade-long quest to create an extremely detailed time-lapse movie of the universe. Rubin is capable of capturing many more stars than any other astronomical observatory ever built; it also sees many more satellites. Up to 40% of images captured by the observatory within its first 10 years of operation will be marred by the satellites' sunlight-reflecting streaks. Meredith Rawls, a research scientist at the telescope's flagship observation project, Vera Rubin's Legacy Survey of Space and Time, is one of the experts tasked with protecting Rubin's science mission from the satellite blight.


Three big things we still don't know about AI's energy burden

MIT Technology Review

AI companies are revealing the one number that researchers have long sought. Earlier this year, when my colleague Casey Crownhart and I spent six months researching the climate and energy burden of AI, we came to see one number in particular as our white whale: how much energy the leading AI models, like ChatGPT or Gemini, use up when generating a single response. This fundamental number remained elusive even as the scramble to power AI escalated to the White House and the Pentagon, and as projections showed that in three years AI could use as much electricity as 22% of all US households. The problem with finding that number, as we explained in our piece published in May, was that AI companies are the only ones who have it. We pestered Google, OpenAI, and Microsoft, but each company refused to provide its figure. Researchers we spoke to who study AI's impact on energy grids compared it to trying to measure the fuel efficiency of a car without ever being able to drive it: making guesses based on rumors of its engine size and what it sounds like going down the highway.


Optimized Renewable Energy Planning MDP for Socially-Equitable Electricity Coverage in the US

Kinnarkar, Riya, Arief, Mansur

arXiv.org Artificial Intelligence

Traditional power grid infrastructure presents significant barriers to renewable energy integration and perpetuates energy access inequities, with low-income communities experiencing disproportionately longer power outages. This study develops a Markov Decision Process (MDP) framework to optimize renewable energy allocation while explicitly addressing social equity concerns in electricity distribution. The model incorporates budget constraints, energy demand variability, and social vulnerability indicators across eight major U.S. cities to evaluate policy alternatives for equitable clean energy transitions. Numerical experiments compare the MDP-based approach against baseline policies including random allocation, greedy renewable expansion, and expert heuristics. Results demonstrate that equity-focused optimization can achieve 32.9% renewable energy penetration while reducing underserved low-income populations by 55% compared to conventional approaches. The expert policy achieved the highest reward, while the Monte Carlo Tree Search baseline provided competitive performance with significantly lower budget utilization. These results show that fair distribution of clean energy resources is achievable without sacrificing overall system performance, and they offer a path toward integrating social equity considerations with climate goals and inclusive access to clean power infrastructure.
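To make the allocation idea concrete, here is a toy sketch of equity-aware budget allocation in Python. This is not the paper's actual model: the city data, reward weights, and policies are all hypothetical, and the paper's full MDP (with demand variability and MCTS baselines) is far richer. The sketch only illustrates the core trade-off the abstract describes: a reward that blends renewable penetration with vulnerability-weighted coverage, compared across a greedy and a random allocation policy.

```python
import random

random.seed(0)

# Hypothetical city data: (annual demand in GWh, social-vulnerability score 0-1).
cities = {
    "A": (100, 0.9),
    "B": (300, 0.2),
    "C": (150, 0.7),
    "D": (250, 0.4),
}

BUDGET = 6       # number of renewable units we can afford to build
UNIT_GWH = 50    # generation added per unit, capped at each city's demand

def reward(alloc):
    """Blend overall renewable penetration with equity-weighted coverage."""
    total_demand = sum(d for d, _ in cities.values())
    supplied = sum(min(alloc[c] * UNIT_GWH, cities[c][0]) for c in cities)
    penetration = supplied / total_demand
    equity = sum(
        cities[c][1] * min(alloc[c] * UNIT_GWH, cities[c][0]) / cities[c][0]
        for c in cities
    ) / len(cities)
    return 0.5 * penetration + 0.5 * equity

def greedy_policy():
    """Place each unit where it raises the blended reward the most."""
    alloc = {c: 0 for c in cities}
    for _ in range(BUDGET):
        best = max(cities, key=lambda c: reward({**alloc, c: alloc[c] + 1}))
        alloc[best] += 1
    return alloc

def random_policy():
    """Scatter the budget uniformly at random across cities."""
    alloc = {c: 0 for c in cities}
    for _ in range(BUDGET):
        alloc[random.choice(list(cities))] += 1
    return alloc

greedy = reward(greedy_policy())
rand = sum(reward(random_policy()) for _ in range(200)) / 200
print(f"greedy reward: {greedy:.3f}, random reward (avg): {rand:.3f}")
```

Because the reward here is separable and concave in each city's allocation, the greedy policy is actually optimal for this toy; the paper's setting, with stochastic demand and sequential decisions, is what motivates the full MDP treatment.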


Can 'ice batteries' cool down our soaring energy demands?

Popular Science

Researchers at Texas A&M University are perfecting a deceptively simple solution for our increasingly overburdened energy grid: ice-cooled buildings. The approach, known as thermal energy storage, or colloquially as "ice batteries", uses energy to freeze liquid overnight, when most people are asleep and electricity demand is lower. The stored ice is then melted to help cool buildings during peak hours. If successful, the result is reduced electricity use for air conditioning during the day, which could lower overall energy demand and help cut costs.
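A quick back-of-envelope sketch shows why the load shift can pay off. All the numbers below (cooling load, rates, round-trip efficiency) are made up for illustration, not drawn from the Texas A&M research:

```python
# Hypothetical inputs: a building's daily cooling load and time-of-use rates.
COOLING_KWH = 500          # daily cooling energy needed, in kWh
PEAK_RATE = 0.30           # $/kWh during peak daytime hours
OFFPEAK_RATE = 0.10        # $/kWh overnight
FREEZE_EFFICIENCY = 0.85   # round-trip losses from making and melting ice

# Baseline: run air conditioning directly during peak hours.
baseline_cost = COOLING_KWH * PEAK_RATE

# Ice battery: buy extra off-peak energy to cover freeze/melt losses.
ice_kwh = COOLING_KWH / FREEZE_EFFICIENCY
ice_cost = ice_kwh * OFFPEAK_RATE

print(f"baseline: ${baseline_cost:.2f}/day, ice battery: ${ice_cost:.2f}/day")
```

Even after the efficiency penalty, the off-peak price difference dominates in this toy case; the real savings depend on local rate structures and how lossy the storage is.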


In a first, Google has released data on how much energy an AI prompt uses

MIT Technology Review

Earlier this year, MIT Technology Review published a comprehensive series on AI and energy, at which time none of the major AI companies would reveal their per-prompt energy usage. Google's new publication at last allows the peek behind the curtain that researchers and analysts have long hoped for. The study takes a broad look at energy demand, covering not only the power used by the AI chips that run models but also all the other infrastructure needed to support that hardware. "We wanted to be quite comprehensive in all the things we included," said Jeff Dean, Google's chief scientist, in an exclusive interview with MIT Technology Review about the new report. A large portion of the energy goes to equipment supporting the AI-specific hardware: the host machine's CPU and memory account for another 25% of the total energy used.


The Hidden Costs of AI: A Review of Energy, E-Waste, and Inequality in Model Development

Winsta, Jenis

arXiv.org Artificial Intelligence

Artificial intelligence (AI) has made remarkable progress in recent years, yet its rapid expansion brings overlooked environmental and ethical challenges. This review explores four critical areas where AI's impact extends beyond performance: energy consumption, electronic waste (e-waste), inequality in compute access, and the hidden energy burden of cybersecurity systems. Drawing from recent studies and institutional reports, the paper highlights systemic issues such as high emissions from model training, rising hardware turnover, global infrastructure disparities, and the energy demands of securing AI. By connecting these concerns, the review contributes to Responsible AI discourse by identifying key research gaps and advocating for sustainable, transparent, and equitable development practices. Ultimately, it argues that AI's progress must align with ethical responsibility and environmental stewardship to ensure a more inclusive and sustainable technological future.